Stock selection with principal component analysis
Authors
Abstract
Similar articles
Variable Selection and Principal Component Analysis
In most applied disciplines, many variables are measured on each individual, resulting in a huge data set consisting of a large number of variables, say p [Sharma (1996)]. Using such a data set in any statistical analysis may cause several problems. The dimensionality of the data set can often be reduced, without disturbing the main features of the whole data set, by Principal...
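A minimal sketch of this kind of PCA-based dimension reduction, assuming a numeric n × p data matrix and scikit-learn; the variable names and the 95% variance threshold are illustrative, not taken from the paper:

# Minimal PCA dimension-reduction sketch (illustrative; not the paper's own procedure).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))              # stand-in for a wide data set with p = 30 variables

X_std = StandardScaler().fit_transform(X)   # PCA is scale-sensitive, so standardize first
pca = PCA(n_components=0.95)                # keep enough components to explain 95% of the variance
X_reduced = pca.fit_transform(X_std)

print(X.shape, "->", X_reduced.shape)
print("explained variance ratios:", pca.explained_variance_ratio_)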
Sparse Principal Component Analysis Incorporating Stability Selection
Principal component analysis (PCA) is a popular dimension reduction method that approximates a numerical data matrix by seeking principal components (PC), i.e. linear combinations of variables that capture maximal variance. Since each PC is a linear combination of all variables of a data set, interpretation of the PCs can be difficult, especially in high-dimensional data. In order to find 'spa...
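By way of contrast with ordinary PCA, a short sketch of sparse PCA using scikit-learn's SparsePCA; the L1 penalty alpha (an illustrative value here) drives most loadings to zero, and the stability-selection step described above is not reproduced:

# Sparse PCA sketch: each component loads on only a few variables (alpha value is illustrative).
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 20))

spca = SparsePCA(n_components=3, alpha=1.0, random_state=1)
spca.fit(X)

# Larger alpha -> sparser, easier-to-interpret components.
nonzero = (np.abs(spca.components_) > 1e-12).sum(axis=1)
print("nonzero loadings per component:", nonzero)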
Hyperparameter Selection in Kernel Principal Component Analysis
In kernel methods, choosing a suitable kernel is indispensable for favorable results. No well-founded methods, however, have been established in general for unsupervised learning. We focus on kernel Principal Component Analysis (kernel PCA), which is a nonlinear extension of principal component analysis and has been used effectively for extracting nonlinear features and reducing dimensionality. ...
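To illustrate how strongly the extracted features depend on the kernel hyperparameter, a brief sketch with scikit-learn's KernelPCA on a toy two-circles data set; the RBF bandwidths are arbitrary and the paper's actual selection criterion is not implemented here:

# Kernel PCA sketch: extracted features depend strongly on the RBF bandwidth gamma.
import numpy as np
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA

X, _ = make_circles(n_samples=300, factor=0.3, noise=0.05, random_state=0)

for gamma in (0.1, 10.0):                   # two arbitrary bandwidths, not a principled choice
    kpca = KernelPCA(n_components=2, kernel="rbf", gamma=gamma)
    Z = kpca.fit_transform(X)
    print(f"gamma={gamma}: variance of the two extracted features = {Z.var(axis=0)}")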
Robust Principal Component Analysis with Adaptive Selection for Tuning Parameters
The present paper discusses robustness against outliers in principal component analysis (PCA). We propose a class of procedures for PCA based on the minimum psi principle, which unifies various approaches, including the classical procedure and recently proposed procedures. The reweighted matrix algorithm for off-line data and the gradient algorithm for on-line data are both investigated with ...
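As a loose illustration of reweighting against outliers, a generic reweighted-PCA sketch (down-weight observations with large reconstruction error and refit); this is not the minimum psi procedure or either of the paper's algorithms:

# Generic reweighted-PCA sketch (illustrative only; not the minimum psi procedure).
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
X[:10] += 15.0                                        # inject a few gross outliers

w = np.ones(len(X))                                   # observation weights
for _ in range(10):
    mu = np.average(X, axis=0, weights=w)
    Xc = X - mu
    cov = (Xc * w[:, None]).T @ Xc / w.sum()          # weighted covariance
    vals, vecs = np.linalg.eigh(cov)
    V = vecs[:, -2:]                                  # top-2 principal directions
    resid = np.linalg.norm(Xc - Xc @ V @ V.T, axis=1) # reconstruction error per observation
    w = 1.0 / (1.0 + (resid / np.median(resid)) ** 2) # down-weight poorly fitted observations

print("mean weight of outliers:", w[:10].mean(), "vs. clean points:", w[10:].mean())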
Principal Component Projection Without Principal Component Analysis
We show how to efficiently project a vector onto the top principal components of a matrix, without explicitly computing these components. Specifically, we introduce an iterative algorithm that provably computes the projection using few calls to any black-box routine for ridge regression. By avoiding explicit principal component analysis (PCA), our algorithm is the first with no runtime dependen...
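The core ridge-regression primitive can be sketched directly: one ridge solve applies the "soft step" s / (s + lambda) to each eigenvalue s of A^T A, which already concentrates a vector on components whose eigenvalues lie well above lambda; the paper sharpens this step with further ridge calls. The sketch below uses made-up data and a direct solve in place of a black-box ridge routine, and shows only that soft-step behaviour, not the full algorithm:

# Soft-step sketch: one ridge solve approximately projects v onto top eigen-directions of A^T A.
# Illustrative only; the paper iterates such ridge calls to sharpen the step into a projection.
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(300, 8)) * np.array([30, 25, 20, 1, 1, 1, 1, 1])  # three dominant directions
v = rng.normal(size=8)
lam = 1e4                                    # threshold separating "top" from "tail" eigenvalues

M = A.T @ A
soft = M @ np.linalg.solve(M + lam * np.eye(8), v)    # applies s / (s + lam) to each eigenvalue s

# Exact projection onto eigenvectors of A^T A with eigenvalue above lam, for comparison.
vals, vecs = np.linalg.eigh(M)
P = vecs[:, vals > lam]
exact = P @ (P.T @ v)

print("relative error of one soft step:", np.linalg.norm(soft - exact) / np.linalg.norm(exact))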
Journal
Journal title: The Journal of Investment Strategies
Year: 2016
ISSN: 2047-1238
DOI: 10.21314/jois.2016.067